1,543 research outputs found

    On the Intersection Property of Conditional Independence and its Application to Causal Discovery

    Full text link
    This work investigates the intersection property of conditional independence. It states that for random variables A, B, C and X, if X is independent of A given (B, C) and X is independent of B given (A, C), then X is independent of (A, B) given C. Under the assumption that the joint distribution has a continuous density, we provide necessary and sufficient conditions under which the intersection property holds. The result has direct applications to causal inference: it leads to strictly weaker conditions under which the graphical structure becomes identifiable from the joint distribution of an additive noise model.
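    For reference, the intersection property described in the abstract can be written compactly as follows (a plain restatement in standard conditional-independence notation, not part of the original abstract):

```latex
% Intersection property of conditional independence:
% if X is independent of A given (B, C) and independent of B given (A, C),
% then X is independent of the pair (A, B) given C.
\[
  X \perp\!\!\!\perp A \mid (B, C)
  \;\wedge\;
  X \perp\!\!\!\perp B \mid (A, C)
  \;\Longrightarrow\;
  X \perp\!\!\!\perp (A, B) \mid C .
\]
```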

    Structural Intervention Distance (SID) for Evaluating Causal Graphs

    Full text link
    Causal inference relies on the structure of a graph, often a directed acyclic graph (DAG). Different graphs may result in different causal inference statements and different intervention distributions. To quantify such differences, we propose a (pre-) distance between DAGs, the structural intervention distance (SID). The SID is based on a graphical criterion only and quantifies the closeness between two DAGs in terms of their corresponding causal inference statements. It is therefore well-suited for evaluating graphs that are used for computing interventions. Instead of DAGs, it is also possible to compare CPDAGs, completed partially directed acyclic graphs that represent Markov equivalence classes. Since it differs significantly from the popular Structural Hamming Distance (SHD), the SID constitutes a valuable additional measure. We discuss properties of this distance and provide an efficient implementation with software code available on the first author's homepage (an R package is under construction).
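    The Structural Hamming Distance (SHD) that the SID is contrasted with above is simple to compute; below is a minimal Python sketch under the assumption that both DAGs are given as 0/1 adjacency matrices with A[i, j] = 1 encoding an edge i -> j, and using the common convention that a reversed edge counts as a single mistake. The SID itself is not computed here; roughly speaking, it additionally checks, for every ordered pair of nodes, whether parent adjustment in the estimated graph yields the correct intervention distribution in the true graph, which is what the authors' implementation provides.

```python
import numpy as np

def structural_hamming_distance(a_true, a_est):
    """Structural Hamming Distance between two DAGs.

    Both graphs are 0/1 adjacency matrices with a[i, j] == 1 meaning an
    edge i -> j.  For each unordered pair {i, j}, the pair counts as one
    mistake if the two graphs disagree on its status
    (no edge / i -> j / j -> i), so missing, extra and reversed edges
    each contribute one.
    """
    a_true = np.asarray(a_true, dtype=int)
    a_est = np.asarray(a_est, dtype=int)
    p = a_true.shape[0]
    d = 0
    for i in range(p):
        for j in range(i + 1, p):
            if (a_true[i, j], a_true[j, i]) != (a_est[i, j], a_est[j, i]):
                d += 1
    return d

# Toy example: true DAG 0 -> 1 -> 2 versus an estimate with 1 -> 2 reversed.
A_true = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])
A_est = np.array([[0, 1, 0], [0, 0, 0], [0, 1, 0]])
print(structural_hamming_distance(A_true, A_est))  # prints 1
```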

    Switching Regression Models and Causal Inference in the Presence of Discrete Latent Variables

    Get PDF
    Given a response Y and a vector X = (X^1, ..., X^d) of d predictors, we investigate the problem of inferring direct causes of Y among the components of X. Models for Y that use all of its causal covariates as predictors enjoy the property of being invariant across different environments or interventional settings. Given data from such environments, this property has been exploited for causal discovery. Here, we extend this inference principle to situations in which some (discrete-valued) direct causes of Y are unobserved. Such cases naturally give rise to switching regression models. We provide sufficient conditions for the existence, consistency and asymptotic normality of the MLE in linear switching regression models with Gaussian noise, and construct a test for the equality of such models. These results allow us to prove that the proposed causal discovery method obtains asymptotic false discovery control under mild conditions. We provide an algorithm, make available code, and test our method on simulated data. It is robust against model violations and outperforms state-of-the-art approaches. We further apply our method to a real data set, where we show that it not only outputs causal predictors, but also a process-based clustering of data points, which could be of additional interest to practitioners.
    Comment: 46 pages, 14 figures; real-world application added in Section 5.2; additional numerical experiments added in the Appendix
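    The invariance principle referred to above (models that use all causal covariates remain invariant across environments) can be illustrated with a toy search over predictor subsets. The sketch below is not the authors' switching-regression method; it is a minimal, fully observed variant in which invariance of a subset is checked by comparing pooled-regression residuals across known environments (ANOVA for equal means, Levene's test for equal variances). All names and the choice of tests are illustrative.

```python
import itertools
import numpy as np
from scipy import stats

def invariant_sets(X, y, env, alpha=0.05):
    """Toy invariant-causal-prediction search.

    For every subset S of predictors, fit a pooled least-squares
    regression of y on X[:, S] and test whether the residuals look
    identically distributed across environments (equal means via a
    one-way ANOVA, equal variances via Levene's test).  Subsets that
    pass both tests are collected; their intersection is returned as
    the estimate of the identifiable causal predictors.
    """
    n, d = X.shape
    groups = np.unique(env)
    accepted = []
    for k in range(d + 1):
        for S in itertools.combinations(range(d), k):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in S])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            per_env = [resid[env == g] for g in groups]
            p_mean = stats.f_oneway(*per_env).pvalue
            p_var = stats.levene(*per_env).pvalue
            if min(p_mean, p_var) > alpha:
                accepted.append(set(S))
    return set.intersection(*accepted) if accepted else set()
```

    In the paper's setting, one of the direct causes is an unobserved discrete variable, which turns each candidate regression into a switching (mixture) regression; the toy version above ignores this and assumes all relevant predictors are observed.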

    A Homologous Series of Cobalt, Rhodium, and Iridium Metalloradicals

    Get PDF
    We herein present a series of d^7 trimethylphosphine complexes of group 9 metals that are chelated by the tripodal tetradentate tris(phosphino)silyl ligand [SiP^(iPr)_3]H ([SiP^(iPr)_3] = (2-iPr_2PC_6H_4)_3Si^–). Both electron paramagnetic resonance (EPR) simulations and density functional theory (DFT) calculations indicate largely metalloradical character. These complexes provide a rare opportunity to compare the properties of low-valent metalloradicals of the second- and third-row transition metals with those of the corresponding first-row analogues.

    Distributional Robustness of K-class Estimators and the PULSE

    Full text link
    Recently, in causal discovery, invariance properties, such as the moment criterion that the two-stage least squares estimator leverages, have been exploited for causal structure learning: e.g., in cases where the causal parameter is not identifiable, some structure of the non-zero components may still be identified, and coverage guarantees are available. Subsequently, anchor regression has been proposed to trade off invariance and predictability. The resulting estimator is shown to have optimal predictive performance under bounded shift interventions. In this paper, we show that the concepts of anchor regression and K-class estimators are closely related. Establishing this connection comes with two benefits: (1) it enables us to prove robustness properties for existing K-class estimators when considering distributional shifts; and (2) we propose a novel estimator in instrumental variable settings by minimizing the mean squared prediction error subject to the constraint that the estimator lies in an asymptotically valid confidence region of the causal parameter. We call this estimator PULSE (p-uncorrelated least squares estimator) and show that it can be computed efficiently, even though the underlying optimization problem is non-convex. We further prove that it is consistent. We perform simulation experiments illustrating that there are several settings, including weak instrument settings, in which PULSE outperforms other estimators and exhibits less variability.
    Comment: 85 pages, 15 figures
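    The K-class estimator referred to above has a well-known closed form, which may help make the connection concrete; the numpy sketch below assumes a linear IV setting with regressor matrix X, instrument matrix Z and response y (variable names are illustrative). It does not implement PULSE itself, which chooses its tuning data-dependently under the confidence-region constraint described in the abstract.

```python
import numpy as np

def k_class_estimator(X, Z, y, kappa):
    """Closed-form K-class estimator for a linear IV model:

        beta(kappa) = (X' (I - kappa * M_Z) X)^{-1} X' (I - kappa * M_Z) y,

    where M_Z = I - Z (Z'Z)^{-1} Z' projects onto the orthogonal
    complement of the instrument space.  kappa = 0 recovers ordinary
    least squares and kappa = 1 recovers two-stage least squares (TSLS).
    """
    n = X.shape[0]
    proj_z = Z @ np.linalg.solve(Z.T @ Z, Z.T)         # projection onto col(Z)
    weight = np.eye(n) - kappa * (np.eye(n) - proj_z)  # I - kappa * M_Z
    return np.linalg.solve(X.T @ weight @ X, X.T @ weight @ y)
```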

    Invariant Causal Prediction for Sequential Data

    Full text link
    We investigate the problem of inferring the causal predictors of a response Y from a set of d explanatory variables (X^1, ..., X^d). Classical ordinary least squares regression includes all predictors that reduce the variance of Y. Using only the causal predictors instead leads to models that have the advantage of remaining invariant under interventions; loosely speaking, they lead to invariance across different "environments" or "heterogeneity patterns". More precisely, the conditional distribution of Y given its causal predictors remains invariant for all observations. Recent work exploits such stability to infer causal relations from data with different but known environments. We show that even without knowledge of the environments or heterogeneity patterns, inferring causal relations is possible for time-ordered (or any other type of sequentially ordered) data. In particular, this allows detecting instantaneous causal relations in multivariate linear time series, which is usually not possible with Granger causality. Besides novel methodology, we provide statistical confidence bounds and asymptotic detection results for inferring causal predictors, and present an application to monetary policy in macroeconomics.
    Comment: 55 pages
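    The invariance that the method exploits ("the conditional distribution of Y given its causal predictors remains invariant for all observations") can be written compactly; the statement below follows the abstract's notation, with S* denoting the unknown set of causal predictors, and restates the assumption rather than the estimation procedure.

```latex
% Invariance of the causal conditional for sequentially ordered data:
% for the causal set S* and any two time points s and t,
\[
  \big( Y_t \,\big|\, X_t^{S^*} = x \big)
  \;\stackrel{d}{=}\;
  \big( Y_s \,\big|\, X_s^{S^*} = x \big)
  \qquad \text{for all } x .
\]
```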